8 research outputs found

    Towards Universal Probabilistic Programming with Message Passing on Factor Graphs

    Adaptive Importance Sampling Message Passing

    The aim of Probabilistic Programming (PP) is to automate inference in probabilistic models. One efficient realization of PP-based inference concerns variational message passing (VMP)-based inference in a factor graph. VMP is efficient, but in principle it only leads to closed-form update rules when the model consists of conjugate and/or conditionally conjugate factor pairs. Recently, Extended Variational Message Passing (EVMP) has been proposed to broaden the applicability of VMP through importance sampling-based particle methods for non-linear and non-conjugate factor pairs. EVMP automates the importance sampling procedure by employing forward messages as proposal distributions, which unfortunately may lead to inaccurate estimates and numerical instabilities when the forward message is not a good representative of the unknown correct posterior. This paper addresses this issue by integrating an adaptive importance sampling procedure with message passing-based inference. The resulting method is a hyperparameter-free approximate inference engine that combines recent advances in stochastic adaptive importance sampling and optimization methods. We provide an implementation of the proposed method in the Julia package ForneyLab.jl.
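
    As a rough illustration of why the proposal matters, the minimal Julia sketch below (plain Julia, not the ForneyLab.jl implementation; all densities and parameter values are hypothetical) runs self-normalized importance sampling once with the forward message as the proposal and once more after a single moment-matching adaptation of the proposal:

```julia
# Hypothetical one-dimensional example: the unnormalized posterior is the product
# of a Gaussian forward message N(0, 1) and a non-conjugate likelihood factor,
# stood in here by N(3, 0.5^2).
gauss(x, m, s) = exp(-0.5 * ((x - m) / s)^2) / (s * sqrt(2π))
fwd(x)  = gauss(x, 0.0, 1.0)          # forward message (EVMP proposal)
lik(x)  = gauss(x, 3.0, 0.5)          # non-conjugate likelihood factor
targ(x) = fwd(x) * lik(x)             # unnormalized posterior

# Self-normalized importance sampling with a Gaussian proposal N(m, s^2).
function snis(m, s; n = 10_000)
    xs  = m .+ s .* randn(n)                  # draws from the proposal
    w   = targ.(xs) ./ gauss.(xs, m, s)       # unnormalized importance weights
    w ./= sum(w)                              # self-normalize
    μ   = sum(w .* xs)                        # posterior mean estimate
    v   = sum(w .* (xs .- μ) .^ 2)            # posterior variance estimate
    ess = 1 / sum(w .^ 2)                     # effective sample size
    return μ, v, ess
end

# EVMP-style: proposal fixed to the forward message (poor overlap, low ESS).
μ0, v0, ess0 = snis(0.0, 1.0)
# One adaptation step: refit the proposal to the estimated moments and resample,
# a crude stand-in for an adaptive importance sampling loop.
μ1, v1, ess1 = snis(μ0, sqrt(v0))

println("fixed proposal:   mean ≈ ", μ0, ", ESS ≈ ", ess0)
println("adapted proposal: mean ≈ ", μ1, ", ESS ≈ ", ess1)
```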

    Extended Variational Message Passing for Automated Approximate Bayesian Inference

    Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for approximating Bayesian inference in factorized probabilistic models that consist of conjugate exponential family distributions. The automation of Bayesian inference tasks is important since many data processing problems can be formulated as inference tasks on a generative probabilistic model. However, accurate generative models may also contain deterministic and possibly nonlinear variable mappings and non-conjugate factor pairs that complicate the automatic execution of the VMP algorithm. In this paper, we show that executing VMP in complex models relies on the ability to compute the expectations of the statistics of hidden variables. We extend the applicability of VMP by approximating the required expectation quantities in appropriate cases by importance sampling and the Laplace approximation. As a result, the proposed Extended VMP (EVMP) approach supports automated, efficient inference for a very wide range of probabilistic model specifications. We implemented EVMP in the Julia-language probabilistic programming package ForneyLab.jl and show through a number of examples that EVMP yields an almost universal inference engine for factorized probabilistic models.
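
    A minimal plain-Julia sketch of one of the two approximation routes, the Laplace approximation (the log-density, step sizes, and iteration count below are illustrative assumptions, not the paper's implementation):

```julia
# Hypothetical intractable log-density: a Gaussian prior factor combined with a
# logistic (non-conjugate) likelihood factor.
logp(x) = -0.5 * (x - 1.0)^2 - log1p(exp(-2.0 * x))

# Central finite differences for the first and second derivatives.
d1(f, x; h = 1e-5) = (f(x + h) - f(x - h)) / (2 * h)
d2(f, x; h = 1e-4) = (f(x + h) - 2 * f(x) + f(x - h)) / h^2

# Laplace approximation: locate the mode with Newton steps and use the negative
# inverse curvature at the mode as the variance of the Gaussian approximation.
function laplace(logdensity, x0; iters = 25)
    x = x0
    for _ in 1:iters
        x -= d1(logdensity, x) / d2(logdensity, x)
    end
    return x, -1 / d2(logdensity, x)
end

m, v = laplace(logp, 0.0)
println("Laplace approximation: N(mean = ", m, ", var = ", v, ")")
# The moments (m, v) can then stand in for the required expectations of the
# hidden-variable statistics in the VMP update rules.
```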

    Gaussian Process-based Amortization of Variational Message Passing Update Rules

    Variational Message Passing (VMP) facilitates automated variational inference in factorized probabilistic models where connected factors are conjugate pairs. Conjugate-Computation Variational Inference (CVI) extends the applicability of VMP to models comprising both conjugate and non-conjugate factors. CVI makes use of a gradient that is estimated by Monte Carlo (MC) sampling, which potentially leads to a substantial computational load. As a result, for models that feature a large number of non-conjugate pairs, CVI-based inference may not scale well to larger model sizes. In this paper, we propose a Gaussian Process-enhanced CVI approach, called GP-CVI, to amortize the computational costs caused by the MC sampling procedures in CVI. Specifically, we train a Gaussian process regression (GPR) model on a set of incoming-outgoing message pairs that were generated by CVI. In operation, we use the “cheaper” GPR model to produce outgoing messages and resort to the more accurate but expensive CVI message only if the variance of the outgoing message exceeds a threshold. By experimental validation, we show that GP-CVI gradually uses more of the fast, memory-based update rule computations and fewer of the sampling-based ones. As a result, GP-CVI speeds up CVI with a controllable effect on the accuracy of the inference procedure.
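
    The amortization idea can be sketched in a few lines of plain Julia; the kernel, hyperparameters, threshold, and the stand-in “expensive” update rule below are illustrative assumptions rather than the paper's GP-CVI rule:

```julia
using LinearAlgebra

# Squared-exponential kernel for a scalar message summary.
rbf(a, b; ℓ = 1.0, σf = 1.0) = σf^2 * exp(-0.5 * (a - b)^2 / ℓ^2)

# "Expensive" update rule, stood in here by a slow deterministic function.
expensive_update(x) = (sleep(0.01); sin(x) + 0.1 * x^2)

# Training set: incoming-message summaries xs and their outgoing-message summaries ys.
xs = collect(range(-3.0, 3.0, length = 20))
ys = expensive_update.(xs)

# Standard GP regression with a small noise variance σn².
σn = 1e-3
K  = [rbf(a, b) for a in xs, b in xs] + σn^2 * I
α  = K \ ys

function gp_predict(x)
    k = [rbf(x, a) for a in xs]
    μ = dot(k, α)
    v = rbf(x, x) - dot(k, K \ k)        # predictive variance
    return μ, v
end

# Hybrid rule: cheap GP prediction when confident, expensive update otherwise.
function hybrid_update(x; vmax = 1e-2)
    μ, v = gp_predict(x)
    return v < vmax ? μ : expensive_update(x)
end

println(hybrid_update(0.5))   # likely served by the GP (inside the training range)
println(hybrid_update(6.0))   # falls back to the expensive rule (out of range)
```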

    Probabilistic programming with stochastic variational message passing

    Stochastic approximation methods for variational inference have recently gained popularity in the probabilistic programming community since these methods are amenable to automation and allow online, scalable, and universal approximate Bayesian inference. Unfortunately, common Probabilistic Programming Languages (PPLs) with stochastic approximation engines lack the efficiency of message passing-based inference algorithms with deterministic update rules, such as Belief Propagation (BP) and Variational Message Passing (VMP). Still, Stochastic Variational Inference (SVI) and Conjugate-Computation Variational Inference (CVI) provide principled methods to integrate fast deterministic inference techniques with broadly applicable stochastic approximate inference. Unfortunately, implementation of SVI and CVI necessitates manually derived variational update rules, which do not yet exist in most PPLs. In this paper, we cast SVI and CVI explicitly in a message passing-based inference context. We provide an implementation of SVI and CVI in ForneyLab, an automated message passing-based probabilistic programming package in the open-source Julia language. Through a number of experiments, we demonstrate how SVI and CVI extend the automated inference capabilities of message passing-based probabilistic programming.
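
    For reference, the classic SVI update that is being cast into a message passing context can be sketched in plain Julia on a simple conjugate Beta-Bernoulli model (the model, step-size schedule, and minibatch size are illustrative assumptions, not the paper's ForneyLab implementation):

```julia
using Random

# SVI-style update: the Beta posterior's natural parameters are moved towards a
# minibatch-based estimate with a decaying step size ρ_t, instead of being
# recomputed from the full data set at every iteration.
Random.seed!(1)

N    = 10_000                        # full data set size
data = rand(N) .< 0.3                # Bernoulli(0.3) observations
B    = 100                           # minibatch size

# Beta(α, β) written in natural-parameter form λ = (α - 1, β - 1).
λ  = [1.0, 1.0]                      # prior Beta(2, 2)
λ0 = copy(λ)

for t in 1:200
    ρ  = (t + 10.0)^-0.7             # Robbins-Monro step size
    mb = data[rand(1:N, B)]          # random minibatch
    # Estimate of the full-data natural parameter by rescaling the minibatch.
    λhat = λ0 .+ (N / B) .* [sum(mb), B - sum(mb)]
    λ   .= (1 - ρ) .* λ .+ ρ .* λhat # SVI update
end

α, β = λ .+ 1
println("posterior mean estimate ≈ ", α / (α + β))   # close to the true rate 0.3
```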

    Message passing-based system identification for NARMAX models

    We present a variational Bayesian identification procedure for polynomial NARMAX models based on message passing on a factor graph. Message passing allows us to obtain full posterior distributions for the regression coefficients, precision parameters, and noise instances by means of local computations distributed according to the factorization of the dynamic model. The posterior distributions are important for shaping the predictive distribution for the outputs, and ultimately lead to superior model performance in one-step-ahead prediction and simulation.
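
    A simplified plain-Julia sketch of the identification setting, using a polynomial NARX-type regressor with conjugate Bayesian linear regression under a known noise precision (the system, regressor set, and prior are illustrative assumptions; the paper's message-passing treatment additionally infers the noise precision and handles the lagged noise terms of the full NARMAX model):

```julia
using LinearAlgebra, Random

Random.seed!(2)
T = 300
u = randn(T)                                  # input signal
y = zeros(T)
for t in 3:T                                  # hypothetical true system
    y[t] = 0.5 * y[t-1] - 0.2 * y[t-2] + 0.8 * u[t-1] +
           0.3 * u[t-1] * y[t-1] + 0.05 * randn()
end

# Polynomial regressor vector ϕ_t built from lagged outputs and inputs.
ϕ(t) = [1.0, y[t-1], y[t-2], u[t-1], u[t-2],
        y[t-1]^2, u[t-1]^2, u[t-1] * y[t-1]]

Φ = reduce(vcat, [ϕ(t)' for t in 3:T])        # design matrix
z = y[3:T]                                    # regression targets

τ  = 1 / 0.05^2                               # assumed known noise precision
Λ0 = Matrix(1.0I, 8, 8)                       # prior precision of the coefficients
Σ  = inv(Λ0 + τ * Φ' * Φ)                     # posterior covariance
m  = Σ * (τ * Φ' * z)                         # posterior mean

println("posterior coefficient means: ", round.(m; digits = 3))
```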

    The Switching Hierarchical Gaussian Filter

    In this paper we discuss variational message passing (VMP)-based inference in a switching Hierarchical Gaussian Filter (HGF). An HGF is a flexible hierarchical state space model that supports closed-form VMP-based approximate inference for tracking both states and slowly time-varying parameters. Since natural signals often exhibit regime-switching dynamics, there is a need for low-complexity closed-form inference in switching state space models. Here we extend the HGF model with a parameter-switching mechanism and derive closed-form VMP update rules for plug-in applications in factor graph-based models. These VMP rules support both tracking of the latent variables and evaluation of the variational free energy as a model performance measure. We show that the switching HGF performs better than a non-switching HGF at modelling a stock market data set.
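
    A minimal plain-Julia sketch of the assumed generative structure: a two-level HGF whose volatility offset switches between two regimes according to a Markov chain (parameter values and the exact parameterization are illustrative assumptions, not the paper's):

```julia
using Random

# The higher state x2 modulates the log-variance of the lower state x1, and a
# binary Markov regime s_t switches the volatility offset ω between two values.
Random.seed!(3)

T  = 500
κ  = 1.0                      # coupling from x2 to the log-variance of x1
ω  = (-4.0, -1.0)             # volatility offsets for the two regimes
p_stay = 0.98                 # probability of staying in the current regime
σy = 0.1                      # observation noise std
w2 = 0.05                     # random-walk std of the top level

s  = zeros(Int, T); s[1] = 1
x2 = zeros(T); x1 = zeros(T); y = zeros(T)

for t in 2:T
    s[t]  = rand() < p_stay ? s[t-1] : 3 - s[t-1]     # regime switch
    x2[t] = x2[t-1] + w2 * randn()                    # slow top-level drift
    v1    = exp(κ * x2[t] + ω[s[t]])                  # state variance of x1
    x1[t] = x1[t-1] + sqrt(v1) * randn()              # volatile lower level
    y[t]  = x1[t] + σy * randn()                      # observation
end

println("fraction of time in regime 2: ", count(x -> x == 2, s) / T)
# Inference in the paper runs the other way: given y, VMP update rules track
# x1, x2 and the regime posteriors, with variational free energy as a model score.
```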